Bounds for f-Divergences Under Likelihood Ratio Constraints
Similar articles
Proofs for empirical likelihood with general f-divergences
We study extensions of empirical likelihood where the log likelihood ratio is replaced with general f-divergences (which we call empirical divergences). First, we give a novel, elementary proof for χ_d-calibration that does not use duality. We then see how to rigorously prove coverage rates for empirical divergence confidence regions by going beyond the asymptotic expansion for...
On Improved Bounds for Probability Metrics and $f$-Divergences
Derivation of tight bounds for probability metrics and f-divergences is of interest in information theory and statistics. This paper provides elementary proofs that lead, in some cases, to significant improvements over existing bounds; they also lead to the derivation of some existing bounds in a simplified way. The inequalities derived in this paper relate the Bhattacharyya parameter, ...
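One classical example of the kind of bound discussed above is Pinsker's inequality, which upper-bounds the total variation distance by the Kullback-Leibler divergence. The following sketch (not from the paper; distributions `p` and `q` are arbitrary illustrative choices) checks it numerically for discrete distributions:

```python
import math

def kl(p, q):
    """Kullback-Leibler divergence (in nats) between discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def tv(p, q):
    """Total variation distance: half the L1 distance between the mass functions."""
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

# Illustrative distributions (assumed, not from the paper)
p = [0.5, 0.3, 0.2]
q = [0.2, 0.4, 0.4]

# Pinsker's inequality: TV(P, Q) <= sqrt(KL(P || Q) / 2)
assert tv(p, q) <= math.sqrt(kl(p, q) / 2)
```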
Sharp Inequalities for $f$-divergences
f-divergences are a general class of divergences between probability measures which include as special cases many commonly used divergences in probability, mathematical statistics and information theory, such as Kullback-Leibler divergence, chi-squared divergence, squared Hellinger distance, total variation distance, etc. In this paper, we study the problem of maximizing or minimizing an f-dive...
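The special cases listed above all arise from a single formula, D_f(P || Q) = Σ_x q(x) f(p(x)/q(x)), by choosing a convex generator f with f(1) = 0. A minimal sketch (the generators are standard; the distributions are illustrative assumptions) computing several of them for discrete distributions:

```python
import math

def f_divergence(p, q, f):
    """D_f(P || Q) = sum over x of q(x) * f(p(x)/q(x)), for discrete P, Q with q > 0."""
    return sum(qi * f(pi / qi) for pi, qi in zip(p, q))

# Standard generators; each is convex on (0, inf) with f(1) = 0.
generators = {
    "kl":        lambda t: t * math.log(t) if t > 0 else 0.0,  # Kullback-Leibler
    "chi2":      lambda t: (t - 1) ** 2,                       # chi-squared
    "hellinger": lambda t: (math.sqrt(t) - 1) ** 2,            # squared Hellinger
    "tv":        lambda t: 0.5 * abs(t - 1),                   # total variation
}

# Illustrative distributions (assumed for the example)
p, q = [0.5, 0.3, 0.2], [0.2, 0.4, 0.4]
for name, f in generators.items():
    print(f"{name}: {f_divergence(p, q, f):.6f}")
```

With the total-variation generator, the formula reduces to half the L1 distance, matching the direct definition.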
Journal
Journal title: Applications of Mathematics
Year: 2003
ISSN: 0862-7940, 1572-9109
DOI: 10.1023/a:1026054413327